Minimum Fisher information
In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the proper constraints needed to reproduce empirically known expectation values, determines the best probability distribution that characterizes the system. (See also Fisher information.)

== Measures of information ==

Information measures (IM) are the most important tools of information theory. They quantify either the amount of positive information or the amount of "missing" information an observer possesses with regard to a system of interest. The most famous IM is the Shannon entropy (1948), which determines how much additional information the observer still requires in order to have all the available knowledge regarding a given system S, when all the observer has is a probability density function (PD) defined on appropriate elements of that system. The Shannon entropy is therefore a measure of missing information, and it is a function of the PD only. If the observer does not have such a PD, but only a finite set of empirically determined mean values of the system, then a fundamental scientific principle, the maximum entropy principle (MaxEnt), asserts that the "best" PD is the one that reproduces the known expectation values while otherwise maximizing Shannon's IM.
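The two variational problems just described can be written out explicitly. The following is a minimal sketch in standard notation; the symbols p, A_k, λ_k, μ and the one-dimensional setting are illustrative choices, not taken from the article, and the constants in the final equation depend on the normalization convention adopted for the Fisher information. For a density p(x), the Shannon entropy and the Fisher information are

\[
S[p] = -\int p(x)\,\ln p(x)\,dx ,
\qquad
I[p] = \int \frac{1}{p(x)}\left(\frac{dp(x)}{dx}\right)^{2} dx .
\]

MaxEnt maximizes S[p] subject to normalization and the known expectation values \(\langle A_k\rangle = \int A_k(x)\,p(x)\,dx\); the Lagrange-multiplier solution is the familiar exponential (Gibbs) form

\[
p_{\mathrm{MaxEnt}}(x) = \frac{1}{Z}\exp\!\Big(-\sum_k \lambda_k A_k(x)\Big),
\qquad
Z = \int \exp\!\Big(-\sum_k \lambda_k A_k(x)\Big)\,dx .
\]

MFI instead minimizes I[p] under the same constraints. Writing \(p(x)=\psi(x)^{2}\) gives \(I[p]=4\int \psi'(x)^{2}\,dx\), and the Euler-Lagrange equation for the constrained problem becomes a Schrödinger-like equation for the real amplitude ψ,

\[
-4\,\frac{d^{2}\psi(x)}{dx^{2}} + \Big(\sum_k \lambda_k A_k(x)\Big)\psi(x) = \mu\,\psi(x),
\]

where μ is the multiplier enforcing normalization. Up to such convention-dependent constants, the structural contrast is the key point: MaxEnt constraints enter the solution exponentially, while MFI constraints enter as an effective potential in a differential equation.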